Fixed budget quantized kernel least-mean-square algorithm

Authors

  • Songlin Zhao
  • Badong Chen
  • Pingping Zhu
  • José Carlos Príncipe
Abstract

We present a quantization-based kernel least mean square (QKLMS) algorithm with a fixed memory budget. To cope with the growing support inherent in online kernel methods, the proposed method combines a growing and a pruning technique and defines a criterion, termed significance, based on the weighted statistical contribution of a data center. The method requires no a priori information, and its computational complexity is acceptable, scaling linearly with the number of centers. As we show theoretically and experimentally, the introduced algorithm successfully prunes the least 'significant' data and preserves the most important ones, resulting in lower system error.
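The following minimal Python sketch illustrates the general flavor of such a fixed-budget quantized KLMS filter: inputs close to an existing center are merged into it (quantization), distant inputs grow the dictionary, and once the budget is exceeded the lowest-scoring center is pruned. The class name, the significance surrogate used here (a forgetting-factor weighted coefficient magnitude), and all parameter values are illustrative assumptions and do not reproduce the paper's exact significance criterion.

```python
import numpy as np


def gaussian_kernel(x, c, sigma=1.0):
    """Gaussian RBF kernel between an input x and a center c."""
    return np.exp(-np.sum((x - c) ** 2) / (2.0 * sigma ** 2))


class FixedBudgetQKLMS:
    """Illustrative fixed-budget quantized KLMS filter (not the paper's exact method)."""

    def __init__(self, budget=50, quant_size=0.1, step_size=0.5, sigma=1.0, forget=0.99):
        self.budget = budget          # maximum number of stored centers
        self.quant_size = quant_size  # quantization radius
        self.step_size = step_size
        self.sigma = sigma
        self.forget = forget          # forgetting factor for the significance surrogate
        self.centers = []             # quantized dictionary of input centers
        self.coeffs = []              # expansion coefficients
        self.scores = []              # running significance surrogates

    def predict(self, x):
        return sum(a * gaussian_kernel(x, c, self.sigma)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, d):
        x = np.asarray(x, dtype=float)
        e = d - self.predict(x)                        # prediction error
        if self.centers:
            dists = [np.linalg.norm(x - c) for c in self.centers]
            j = int(np.argmin(dists))
        if self.centers and dists[j] <= self.quant_size:
            # Quantization: merge the update into the nearest existing center.
            self.coeffs[j] += self.step_size * e
        else:
            # Growing: admit a new center with its initial coefficient.
            self.centers.append(x)
            self.coeffs.append(self.step_size * e)
            self.scores.append(0.0)
        # Update the significance surrogate: decay old scores, reward current weights.
        self.scores = [self.forget * s + abs(a)
                       for s, a in zip(self.scores, self.coeffs)]
        # Pruning: if over budget, drop the least significant center.
        if len(self.centers) > self.budget:
            k = int(np.argmin(self.scores))
            for lst in (self.centers, self.coeffs, self.scores):
                lst.pop(k)
        return e
```

A short usage example under the same assumptions, with a hypothetical target mapping:

```python
# Toy usage: learn a nonlinear mapping online while keeping at most 30 centers.
rng = np.random.default_rng(0)
f = FixedBudgetQKLMS(budget=30, quant_size=0.2, step_size=0.5, sigma=0.5)
for _ in range(500):
    x = rng.uniform(-1.0, 1.0, size=2)
    d = np.sin(3.0 * x[0]) + 0.5 * x[1] ** 2   # hypothetical target mapping
    f.update(x, d)
print(len(f.centers))                          # never exceeds the budget
```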


Related articles

From Fixed to Adaptive Budget Robust Kernel Adaptive Filtering (Ph.D. dissertation, University of Florida)

Abstract of dissertation presented to the Graduate School of the University of Florida in partial fulfillment of the requirements for the degree of Doctor of Philosophy. From Fixed to Adaptive Budget Robust Kernel Adaptive Filtering, by Songlin Zhao, December 2012. Chair: Jose C. Principe. Major: Electrical and Computer Engineering. Recently, owing to universal modeling capacity, convexity in performance sur...


Online efficient learning with quantized KLMS and L1 regularization

In a recent work, we proposed the quantized kernel least mean square (QKLMS) algorithm, which is quite effective for sequentially learning a nonlinear mapping online with a slowly growing radial basis function (RBF) structure. In this paper, in order to further reduce the network size, we propose a sparse QKLMS algorithm, which is derived by adding a sparsity-inducing l1-norm penalty of th...


Online Nonlinear Granger Causality Detection by Quantized Kernel Least Mean Square

Identifying causal relations among simultaneously acquired signals is an important and challenging task in time series analysis. The original definition of Granger causality was based on linear models, so its application to nonlinear systems may not be appropriate. We consider an extension of Granger causality to nonlinear bivariate time series with the universal approximation capacity in reproducing ...


A Quantized Kernel Learning Algorithm Using a Minimum Kernel Risk-Sensitive Loss Criterion and Bilateral Gradient Technique

Recently, inspired by correntropy, the kernel risk-sensitive loss (KRSL) has emerged as a novel nonlinear similarity measure defined in kernel space, which achieves better computing performance. Applying the KRSL to adaptive filtering yields the corresponding minimum kernel risk-sensitive loss (MKRSL) algorithm. However, MKRSL, as a traditional kernel adaptive filter...


Variable-mixing parameter quantized kernel robust mixed-norm algorithms for combating impulsive interference

Although the kernel robust mixed-norm (KRMN) algorithm outperforms the kernel least mean square (KLMS) algorithm in impulsive noise, it still has two major problems: (1) the choice of the mixing parameter in the KRMN is crucial for satisfactory performance; (2) the structure of the KRMN grows linearly as the iterations proceed, leading to a high computational burden and memory requir...



Journal:
  • Signal Processing

Volume 93, Issue -

Pages -

Publication year: 2013